New convergence results for the scaled gradient projection method
Authors
Abstract
Similar Resources
A Scaled Gradient Projection Method for Constrained Image Deblurring
A class of scaled gradient projection methods for optimization problems with simple constraints is considered. These iterative algorithms can be useful in variational approaches to image deblurring that lead to the minimization of convex nonlinear functions subject to nonnegativity constraints and, in some cases, to an additional flux conservation constraint. A special gradient projection method is introd...
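As an illustration of the general scheme this abstract describes, the Python sketch below shows one plain form of a scaled gradient projection iteration for a smooth objective under nonnegativity constraints: a diagonally scaled gradient step followed by projection onto the nonnegative orthant. The function name, the fixed step size, and the diag(x_k) scaling are illustrative assumptions, not the specific method introduced in the cited paper.

# A minimal sketch of a scaled gradient projection (SGP) step for
# min f(x) subject to x >= 0: diagonally scaled gradient step, then
# projection onto the nonnegative orthant. Names and parameters are
# illustrative, not taken from the cited paper.
import numpy as np

def scaled_gradient_projection(grad_f, x0, n_iter=200, alpha=1e-2):
    x = np.maximum(x0, 0.0)
    for _ in range(n_iter):
        g = grad_f(x)
        d = np.clip(x, 1e-10, 1e10)              # diagonal scaling D_k ~ diag(x_k), kept bounded
        x = np.maximum(x - alpha * d * g, 0.0)   # scaled step, then projection onto x >= 0
    return x                                     # practical SGP adds a line search on the feasible direction

# Toy usage: nonnegative least squares, f(x) = 0.5 * ||A x - b||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
x_hat = scaled_gradient_projection(grad_f, np.ones(5))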
New results on the convergence of the conjugate gradient method
This paper is concerned with proving theoretical results related to the convergence of the Conjugate Gradient method for solving positive definite symmetric linear systems. New relations for ratios of the A-norm of the error and the norm of the residual are provided starting from some earlier results of Sadok [13]. These results use the well-known correspondence between the Conjugate Gradient m...
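For reference, the sketch below implements the standard (unpreconditioned) Conjugate Gradient iteration for a symmetric positive definite system and records both the residual norm and the A-norm of the error at each step, the two quantities whose ratios the abstract refers to; the specific relations proved in the paper are not reproduced, and the helper name and toy system are assumptions.

# A minimal sketch of Conjugate Gradient for an SPD system A x = b,
# recording (||r_k||, ||x* - x_k||_A) at every step. Computing the A-norm
# of the error needs the exact solution, so this is only meant for small
# illustrative problems.
import numpy as np

def cg_with_error_history(A, b, x_star, n_iter=100, tol=1e-12):
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    history = []
    for _ in range(n_iter):
        e = x_star - x
        history.append((np.linalg.norm(r), np.sqrt(e @ (A @ e))))  # (||r_k||, ||e_k||_A)
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)        # exact line search along p
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)  # standard CG update of the search direction
        p = r_new + beta * p
        r = r_new
    return x, history

# Toy SPD system with known solution
rng = np.random.default_rng(1)
M = rng.standard_normal((10, 10))
A = M @ M.T + 10.0 * np.eye(10)
x_star = rng.standard_normal(10)
x, hist = cg_with_error_history(A, A @ x_star, x_star)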
A scaled gradient projection method for Bayesian learning in dynamical systems
A crucial task in system identification problems is the selection of the most appropriate model class, which is classically addressed by resorting to cross-validation or to order selection criteria based on asymptotic arguments. As recently suggested in the literature, this can be addressed in a Bayesian framework, where model complexity is regulated by a few hyperparameters, which can be estimated...
Constrained Stress Majorization Using Diagonally Scaled Gradient Projection
Constrained stress majorization is a promising new technique for integrating application-specific layout constraints into force-directed graph layout. We significantly improve the speed and convergence properties of the constrained stress-majorization technique for graph layout by employing a diagonal scaling of the stress function. Diagonal scaling requires the active-set quadratic programming ...
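The sketch below only illustrates the general idea of diagonal scaling in a gradient projection step, using a toy quadratic in place of the stress function and simple box constraints in place of layout constraints; the active-set quadratic programming projection mentioned in the abstract is not reproduced, and all names and parameters are illustrative.

# A minimal sketch of diagonal scaling in gradient projection: precondition
# the gradient of a quadratic model q(x) = 0.5 x'Hx - b'x with the inverse
# of its Hessian diagonal before projecting onto simple bounds.
import numpy as np

def diag_scaled_gradient_projection(H, b, lower, upper, n_iter=200):
    d = 1.0 / np.diag(H)                      # inverse diagonal scaling (Jacobi-like)
    x = np.clip(np.zeros_like(b), lower, upper)
    for _ in range(n_iter):
        g = H @ x - b                         # gradient of the quadratic model
        x = np.clip(x - d * g, lower, upper)  # scaled step, projected onto the box
    return x

# Toy usage: badly scaled, diagonally dominant quadratic with box constraints
H = np.diag([1.0, 10.0, 100.0]) + 0.1
b = np.array([1.0, 2.0, 3.0])
x_hat = diag_scaled_gradient_projection(H, b, np.zeros(3), np.full(3, 5.0))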
New Analysis and Results for the Conditional Gradient Method
We present new results for the conditional gradient method (also known as the Frank-Wolfe method). We derive computational guarantees for arbitrary step-size sequences, which are then applied to various step-size rules, including simple averaging and constant step-sizes. We also develop step-size rules and computational guarantees that depend naturally on the warm-start quality of the initial (...
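As a point of reference, the sketch below implements the classical conditional gradient (Frank-Wolfe) iteration on the probability simplex with the standard 2/(k+2) step size; the step-size rules and warm-start guarantees developed in the paper are not reproduced, and the function name and toy problem are assumptions.

# A minimal sketch of the Frank-Wolfe method on the probability simplex,
# where the linear subproblem has a closed form: pick the vertex
# corresponding to the smallest gradient component.
import numpy as np

def frank_wolfe_simplex(grad_f, x0, n_iter=200):
    x = x0.copy()
    for k in range(n_iter):
        g = grad_f(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0          # simplex vertex minimizing <g, s>
        gamma = 2.0 / (k + 2.0)        # classical step-size rule; constant steps are also analyzed
        x = (1.0 - gamma) * x + gamma * s
    return x

# Toy usage: minimize 0.5 * ||x - c||^2 over the simplex
c = np.array([0.1, 0.7, -0.3, 0.5])
x_hat = frank_wolfe_simplex(lambda x: x - c, np.full(4, 0.25))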
Journal
Journal title: Inverse Problems
Year: 2015
ISSN: 0266-5611, 1361-6420
DOI: 10.1088/0266-5611/31/9/095008